On the Tightness of LP Relaxations for Structured Prediction
Authors
Abstract
Structured prediction applications often involve complex inference problems that require the use of approximate methods. Approximations based on linear programming (LP) relaxations have proved particularly successful in this setting, with both theoretical and empirical support. Despite the general intractability of inference, it has been observed that in many real-world applications the LP relaxation is often tight. In this work we propose a theoretical explanation for this striking observation. In particular, we show that learning with LP relaxed inference encourages tightness of training instances. We complement this result with a generalization bound showing that tightness generalizes from train to test data.
Similar resources
Train and Test Tightness of LP Relaxations in Structured Prediction
Structured prediction is used in areas such as computer vision and natural language processing to predict structured outputs such as segmentations or parse trees. In these settings, prediction is performed by MAP inference or, equivalently, by solving an integer linear program. Because of the complex scoring functions required to obtain accurate predictions, both learning and inference typicall...
Conditions beyond treewidth for tightness of higher-order LP relaxations
Linear programming (LP) relaxations are a popular method to attempt to find a most likely configuration of a discrete graphical model. If a solution to the relaxed problem is obtained at an integral vertex then the solution is guaranteed to be exact and we say that the relaxation is tight. We consider binary pairwise models and introduce new methods which allow us to demonstrate refined conditi...
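To make the notion of tightness concrete, here is a minimal Python sketch (not code from any of the papers listed here) of the standard pairwise local-polytope LP relaxation of MAP inference, solved with scipy.optimize.linprog; the map_lp_relaxation helper and the toy potentials are invented for illustration. If the returned pseudomarginals are all 0/1, the LP optimum is an integral vertex and the relaxation is tight for that instance.

# Illustrative sketch only (not from the papers above): the pairwise
# "local polytope" LP relaxation of MAP for a binary pairwise model.
import numpy as np
from scipy.optimize import linprog


def map_lp_relaxation(unary, edges, pairwise):
    """Solve the local-polytope LP relaxation of MAP for a binary pairwise model.

    unary:    list of length-2 arrays, unary[i][x] = theta_i(x)
    pairwise: list of 2x2 arrays, pairwise[e][x, y] = theta_ij(x, y) for edges[e] = (i, j)
    Returns the LP optimum value and the flattened pseudomarginal vector.
    """
    n, m = len(unary), len(edges)
    nvar = 2 * n + 4 * m                        # node + edge pseudomarginals
    c = np.zeros(nvar)
    c[:2 * n] = -np.concatenate(unary)          # linprog minimizes, so negate
    c[2 * n:] = -np.concatenate([p.ravel() for p in pairwise])

    A_eq, b_eq = [], []
    for i in range(n):                          # normalization: sum_x mu_i(x) = 1
        row = np.zeros(nvar)
        row[2 * i:2 * i + 2] = 1.0
        A_eq.append(row)
        b_eq.append(1.0)
    for e, (i, j) in enumerate(edges):          # marginalization constraints
        base = 2 * n + 4 * e
        for x in range(2):                      # sum_y mu_ij(x, y) = mu_i(x)
            row = np.zeros(nvar)
            row[base + 2 * x] = row[base + 2 * x + 1] = 1.0
            row[2 * i + x] = -1.0
            A_eq.append(row)
            b_eq.append(0.0)
        for y in range(2):                      # sum_x mu_ij(x, y) = mu_j(y)
            row = np.zeros(nvar)
            row[base + y] = row[base + 2 + y] = 1.0
            row[2 * j + y] = -1.0
            A_eq.append(row)
            b_eq.append(0.0)

    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0.0, 1.0), method="highs")
    return -res.fun, res.x


# Two variables joined by an attractive (agreement-rewarding) edge: the LP
# optimum sits at an integral vertex, i.e. the relaxation is tight here.
unary = [np.array([0.4, -0.1]), np.array([0.0, 0.3])]
edges = [(0, 1)]
pairwise = [np.array([[0.5, 0.0], [0.0, 0.5]])]
value, mu = map_lp_relaxation(unary, edges, pairwise)
print("LP value:", round(value, 3))                          # 0.9, attained at (0, 0)
print("tight:", bool(np.all(np.minimum(mu, 1 - mu) < 1e-6)))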
Bounding the Integrality Distance of LP Relaxations for Structured Prediction
In structured prediction, a predictor optimizes an objective function over a combinatorial search space, such as the set of all image segmentations, or the set of all part-of-speech taggings. Unfortunately, finding the optimal structured labeling—sometimes referred to as maximum a posteriori (MAP) inference—is, in general, NP-hard [12], due to the combinatorial structure of the problem. Many in...
Characterizing Tightness of LP Relaxations by Forbidding Signed Minors
We consider binary pairwise graphical models and provide an exact characterization (necessary and sufficient conditions observing signs of potentials) of tightness for the LP relaxation on the triplet-consistent polytope of the MAP inference problem, by forbidding an odd-K5 (complete graph on 5 variables with all edges repulsive) as a signed minor in the signed suspension graph. This captures s...
Tightness of LP Relaxations for Almost Balanced Models
Linear programming (LP) relaxations are widely used to attempt to identify a most likely configuration of a discrete graphical model. In some cases, the LP relaxation attains an optimum vertex at an integral location and thus guarantees an exact solution to the original optimization problem. When this occurs, we say that the LP relaxation is tight. Here we consider binary pairwise models and de...
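For contrast with the tight example above, the fragment below reuses the hypothetical map_lp_relaxation helper from the earlier sketch (it is a continuation of that example, not standalone code from these papers) on a frustrated triangle, the textbook instance where the pairwise LP relaxation is loose and returns half-integral pseudomarginals.

# Continuation of the earlier sketch; assumes the map_lp_relaxation helper
# defined above. Every edge rewards disagreement, so any integral labeling of
# the triangle satisfies at most two of the three edges (value 2.0), while the
# LP reaches 3.0 at the half-integral point mu_i = (0.5, 0.5).
import numpy as np

disagree = np.array([[0.0, 1.0], [1.0, 0.0]])
unary = [np.zeros(2) for _ in range(3)]
edges = [(0, 1), (1, 2), (0, 2)]
pairwise = [disagree, disagree, disagree]

value, mu = map_lp_relaxation(unary, edges, pairwise)
print("LP value:", round(value, 3))        # 3.0 > 2.0, so the relaxation is loose
print("node pseudomarginals:", mu[:6])     # all 0.5: a fractional optimum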
Journal: CoRR
Volume: abs/1511.01419
Pages: -
Year of publication: 2015